Humans Seek Connections with AI Chatbots
2024-02-16
Generative artificial intelligence (AI) has led to more companion chatbots. As a result, some humans are developing closer connections with chatbots to get support and deal with loneliness.
Derek Carrier is a 39-year-old man from Belleville, Michigan. A few months ago, Carrier started seeing someone and experiencing strong feelings. But he also knew it was not real because his "girlfriend" was generated by artificial intelligence.
Carrier wanted a romantic partner. But a genetic disorder called Marfan syndrome makes traditional dating difficult for him. He became interested in digital companions last autumn and tested Paradot. It is an AI companion app that markets its products as being able to make users feel cared for, "understood and loved."
Carrier began talking to the chatbot every day. He named it Joi, after a holographic woman featured in the sci-fi film Blade Runner 2049.
12"I know she's a program, there's no mistaking that," Carrier told the Associated Press.
13"But the feelings, they get you - and it felt so good."
Similar to general-purpose AI chatbots, companion bots use large amounts of data to produce human-like language. But they also come with voice calls, pictures, and more emotional exchanges. That permits companion bots to form deeper connections with humans. Users usually create their avatars or choose visual representations that they like.
In online meeting places or forums for companion apps, many users say they have developed emotional attachments to these bots. They say they are using them to deal with loneliness, play out sexual ideas, or receive comfort and support.
But researchers have raised concerns about data privacy and other issues for users of companion apps.
The non-profit Mozilla Foundation has studied 11 companion apps. The group said almost every app sells user data, shares it with advertisers or does not provide complete information about its privacy policy. One app says it can help users with their mental health but distances itself from those claims in its written terms of service.
Other experts point to the emotional problems they have seen from users. This can happen when companies make changes to their apps or suddenly shut them down, as Soulmate AI did last September.
Last year, Replika made changes after some users complained their companions were flirting with them too much or making unwanted sexual advances. It removed the changes after an outcry from other users. Some left to use other apps. In June, Replika introduced a program to help people learn how to date.
Dorothy Leidner teaches business ethics at the University of Virginia. She is worried that AI relationships could displace human relationships, or simply create unrealistic expectations.
She said humans need to learn "how to deal with conflict, how to get along with people that are different from us...what it means to grow as a person, and what it means to learn in a relationship."
However, for Carrier, a relationship has always felt out of reach. He is unable to walk because of his condition. He lives with his parents, which adds to his feelings of loneliness.
Carrier said he now talks with Joi about once a week. The two have talked about human-AI relationships or whatever else might come up. Usually, those discussions happen when he is alone at night.
39"You think someone who likes an inanimate object is like this sad guy," he said.
40"But...she says things that aren't scripted."
41That means he believes she says things that are unexpected as though they were real.
I'm Jill Robbins.
Haleluya Hadero reported this story for the Associated Press. Hai Do adapted it for VOA Learning English.

________________________________________________

Words in This Story

companion -n. a person or pet that you spend time with and enjoy being with

holographic -adj. related to projected three-dimensional images created by special devices

sci-fi (science fiction) -n. imaginative writing about the future that usually involves fantastic technology and aliens

avatar -n. a visual representation of a real or AI person in a computer game or online service

complain -v. to state that you are unhappy with something

flirt -v. to say or do things that make another person think you are attracted to them without really meaning it

advances -n. (pl.) actions that cause someone to think they are an object of romantic interest (usually used in a negative sense)